Upper and Lower Bounds on New F-Divergence in Terms of Relative J-Divergence Measure


Similar Articles

Generalized Relative J-Divergence Measure and Properties

We have considered a one-parametric generalization of the nonsymmetric relative J-divergence measure. The generalized measure is shown to belong to Csiszár's f-divergence class. Further, we have derived bounds for the generalized measure in terms of well-known divergence measures.
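
For orientation, here is a minimal sketch of Csiszár's f-divergence in its standard form, together with one form of the nonsymmetric relative J-divergence that appears in the literature. The exact parametric generalization studied in the paper is not reproduced here, so the relative_j_divergence formula below should be read as an assumption.

```python
import numpy as np

def csiszar_f_divergence(p, q, f):
    """Csiszar's f-divergence C_f(P, Q) = sum_i q_i * f(p_i / q_i),
    for a convex f with f(1) = 0 and strictly positive p, q."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(q * f(p / q)))

def relative_j_divergence(p, q):
    """One form of the nonsymmetric relative J-divergence found in the
    literature: D_R(P, Q) = sum_i (p_i - q_i) * ln((p_i + q_i) / (2 q_i)).
    Assumed form; the paper's parametric generalization is not reproduced."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum((p - q) * np.log((p + q) / (2.0 * q))))

p = [0.5, 0.3, 0.2]
q = [0.3, 0.4, 0.3]
# f(t) = t*ln(t) recovers the Kullback-Leibler divergence as a member of the class.
print(csiszar_f_divergence(p, q, lambda t: t * np.log(t)))
print(relative_j_divergence(p, q))
```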


Bounds on Nonsymmetric Divergence Measure in terms of Other Symmetric and Nonsymmetric Divergence Measures

Vajda (1972) studied a generalized divergence measure of Csiszár's class, the so-called "Chi-m divergence measure." The variational distance and the chi-square divergence are special cases of this generalized divergence measure at m = 1 and m = 2, respectively. In this work, a nonparametric, nonsymmetric measure of divergence, a particular case of Vajda's generalized divergence at m = 4, is taken and charac...
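
As a quick numerical illustration of the special cases named in the abstract, the sketch below implements Vajda's Chi-m divergence in the form chi^m(P, Q) = sum_i |p_i - q_i|^m / q_i^(m-1), which is the form usually quoted in the literature (treat the exact expression as an assumption here). m = 1 recovers the variational distance and m = 2 the chi-square divergence, matching the abstract.

```python
import numpy as np

def chi_m_divergence(p, q, m):
    """Vajda's Chi-m divergence: chi^m(P, Q) = sum_i |p_i - q_i|^m / q_i^(m-1)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(np.abs(p - q) ** m / q ** (m - 1)))

p = [0.5, 0.3, 0.2]
q = [0.3, 0.4, 0.3]
print(chi_m_divergence(p, q, 1))  # variational distance, sum_i |p_i - q_i|
print(chi_m_divergence(p, q, 2))  # chi-square divergence, sum_i (p_i - q_i)^2 / q_i
print(chi_m_divergence(p, q, 4))  # the m = 4 case studied in the paper
```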


Lower bounds on Information Divergence

In this paper we establish lower bounds on the information divergence from a distribution to certain important classes of distributions, such as the Gaussian, exponential, Gamma, Poisson, geometric, and binomial. These lower bounds are tight, and for several convergence theorems where a rate of convergence can be computed, this rate is determined by the lower bounds proved in this paper. General techniques fo...
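
The paper's bounds are not reproduced here, but the quantity they bound, the divergence from a distribution P to an entire class, is easy to compute for exponential families, where the minimizing member of the family is obtained by moment matching. A minimal sketch for the Poisson class, relying on that standard fact:

```python
import numpy as np
from scipy.stats import poisson

def kl_divergence(p, q):
    """D(P||Q) = sum_i p_i * ln(p_i / q_i), with the 0*ln(0) = 0 convention."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# P: an arbitrary distribution supported on {0, ..., 9}.
support = np.arange(10)
p = np.array([1, 2, 4, 6, 5, 3, 2, 1, 1, 1], dtype=float)
p /= p.sum()

# For an exponential family such as the Poisson, D(P||Q_lambda) over the
# family is minimized by matching the mean: lambda* = E_P[X].
lam_star = float(support @ p)
div_to_class = kl_divergence(p, poisson.pmf(support, lam_star))

# Sanity check on a grid: no other lambda gives a smaller divergence.
grid = np.linspace(0.5, 8.0, 200)
assert all(kl_divergence(p, poisson.pmf(support, lam)) >= div_to_class - 1e-9
           for lam in grid)
print(lam_star, div_to_class)
```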


A Symmetric Information Divergence Measure of the Csiszár's f-Divergence Class and Its Bounds

Keywords: Parametric measure, Nonparametric measure, Csiszár's f-divergence, Information measure. 1. INTRODUCTION. There are several types of information divergence measures studied in literature which compare two probability distributions and have applications in information theory, statistics and engineering. A convenient classification to differe...
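
To make the symmetric/nonsymmetric classification concrete, the sketch below evaluates a few standard members of Csiszár's class through their generating functions f. Only well-known generators are used; the particular symmetric measure introduced in the paper is not reproduced here.

```python
import numpy as np

def f_divergence(p, q, f):
    """C_f(P, Q) = sum_i q_i * f(p_i / q_i) for convex f with f(1) = 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(q * f(p / q)))

# Standard generators; symmetric measures satisfy C_f(P, Q) = C_f(Q, P).
generators = {
    "KL (nonsymmetric)":         lambda t: t * np.log(t),
    "chi-square (nonsymmetric)": lambda t: (t - 1) ** 2,
    "variational (symmetric)":   lambda t: np.abs(t - 1),
    "J-divergence (symmetric)":  lambda t: (t - 1) * np.log(t),
}

p, q = [0.5, 0.3, 0.2], [0.3, 0.4, 0.3]
for name, f in generators.items():
    print(f"{name:26s} P->Q: {f_divergence(p, q, f):.4f}   "
          f"Q->P: {f_divergence(q, p, f):.4f}")
```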


Empirically Estimable Classification Bounds Based on a New Divergence Measure

Information divergence functions play a critical role in statistics and information theory. In this paper we show that a nonparametric f-divergence measure can be used to provide improved bounds on the minimum binary classification probability of error for the case when the training and test data are drawn from the same distribution and for the case where there exists some mismatch between tr...
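
For context on how a divergence can bound classification error, recall the classical relation for equal priors: the minimum (Bayes) error equals (1 - TV(P, Q))/2, where TV is the total variation distance. The sketch below computes this standard quantity; it is not the empirically estimable divergence measure proposed in the paper.

```python
import numpy as np

def bayes_error_equal_priors(p, q):
    """Exact minimum binary classification error for known class-conditional
    distributions P, Q and equal priors:
        eps* = 0.5 * (1 - TV(P, Q)),  TV(P, Q) = 0.5 * sum_i |p_i - q_i|.
    This is the classical variational-distance relation, not the paper's
    empirically estimable f-divergence."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    tv = 0.5 * np.sum(np.abs(p - q))
    return 0.5 * (1.0 - tv)

p = [0.5, 0.3, 0.2]
q = [0.1, 0.3, 0.6]
print(bayes_error_equal_priors(p, q))  # 0.5 * (1 - 0.4) = 0.3
```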



Journal

Journal title: Journal of Applied & Computational Mathematics

Year: 2013

ISSN: 2168-9679

DOI: 10.4172/2168-9679.1000131